
Add Ollama-webui package and service for Mixtral #275448

Closed · wants to merge 2 commits

Conversation

@malteneuss (Contributor) commented Dec 19, 2023

Description of changes

Related to #273556:

  • Add Ollama-WebUI package that mimics ChatGPT's frontend and integrates nicely with existing Ollama package.
  • Extend the Ollama service module for NixOS with options for home lab deployments
  • Add Ollama-WebUI service module for NixOS

Background:
To make open-source large language models (LLMs) accessible, there are projects like Ollama that make it almost trivial to download and run them locally on a consumer computer.
We already have Ollama in Nixpkgs, but it can only be used conveniently from a terminal (and doesn't store previous chats). What's missing is a web UI, e.g. Ollama-WebUI, which mimics ChatGPT's frontend and integrates nicely with Ollama.

Things done

  • Built on platform(s)
    • x86_64-linux
    • aarch64-linux
    • x86_64-darwin
    • aarch64-darwin
  • For non-Linux: Is sandboxing enabled in nix.conf? (See Nix manual)
    • sandbox = relaxed
    • sandbox = true
  • Tested, as applicable:
  • Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
  • Tested basic functionality of all binary files (usually in ./result/bin/)
  • 24.05 Release Notes (or backporting 23.05 and 23.11 Release notes)
    • (Package updates) Added a release notes entry if the change is major or breaking
    • (Module updates) Added a release notes entry if the change is significant
    • (Module addition) Added a release notes entry if adding a new NixOS module
  • Fits CONTRIBUTING.md.

Add a 👍 reaction to pull requests you find important.

@NixOSInfra added the "12. first-time contribution" label on Dec 19, 2023
@eclairevoyant added the "2.status: work-in-progress" label on Dec 19, 2023
@nixos-discourse commented

This pull request has been mentioned on NixOS Discourse. There might be relevant details there:

https://discourse.nixos.org/t/how-to-package-static-single-page-nodejs-webapp-ollama-webui/37074/1

@ofborg bot added the "2.status: merge conflict" label on Dec 19, 2023
@malteneuss force-pushed the ollama-mixtral-services branch 8 times, most recently from a4d4021 to 13003f5, on December 22, 2023
@github-actions bot added the "6.topic: nixos" and "8.has: module (update)" labels on Dec 22, 2023
@ofborg bot added the "8.has: package (new)", "11.by: package-maintainer", "10.rebuild-darwin: 1-10", "10.rebuild-darwin: 1", "10.rebuild-linux: 1-10", and "10.rebuild-linux: 1" labels, and removed the "2.status: merge conflict" label, on Dec 22, 2023
@malteneuss force-pushed the ollama-mixtral-services branch 4 times, most recently from 9bb3767 to e07509d, on December 22, 2023
@trzpiot (Contributor) commented Jan 10, 2024

Hi @malteneuss, thanks for packaging Ollama Web UI. 💙 In the meantime, Ollama has been added as a service (but I haven't tested the service yet).[1] What's missing for bringing the Ollama Web UI to Nixpkgs? 😊

Footnotes

  1. https://github.com/NixOS/nixpkgs/blob/nixos-unstable/nixos/modules/services/misc/ollama.nix

@mschwaig (Member) commented

As-is this does not build, but with a few changes to get it to build locally, I have been running this PR for a few days and it's been working well. I have not looked at what would be required to land it in nixpkgs.
https://github.com/mschwaig/nixpkgs/tree/ollama-mixtral-services

@malteneuss force-pushed the ollama-mixtral-services branch from e07509d to d48979f on January 12, 2024
@malteneuss (Contributor, author) commented

@trzpiot I wanted to add an automatic NixOS test (since ollama-webui needs to closely follow ollama; otherwise it breaks, as it already has a few times), but I haven't had time to learn how to set this up. Maybe a manual test would suffice for now to get it merged. Thanks for redirecting me to the existing ollama service (someone was faster ;) but I would still like to add some more knobs to make Ollama deployable to a home lab server).
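For reference, a minimal sketch of what such an automatic NixOS VM test might look like, assuming the usual nixos/tests harness; the service names follow this PR, and the default port 8080 is an assumption (untested):

# nixos/tests/ollama-webui.nix (hypothetical)
import ./make-test-python.nix ({ pkgs, ... }: {
  name = "ollama-webui";

  nodes.machine = { ... }: {
    services.ollama.enable = true;
    services.ollama-webui.enable = true;
  };

  testScript = ''
    machine.wait_for_unit("ollama-webui.service")
    # Port 8080 is an assumed default; adjust to the module's option.
    machine.wait_for_open_port(8080)
    machine.succeed("curl --fail http://localhost:8080/")
  '';
})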

@mschwaig Thanks for testing it out. I pushed fixes similar to yours into the PR. I will try to test the setup during the weekend. Maybe you could do the same?

@mschwaig (Member) commented

I have been running this branch since yesterday.

Two things I noticed:

  1. My existing model files disappeared. This probably happens because I got ollama from the previous version of this branch, and now I'm getting the one that was merged in the meantime, and they store their models in different locations. I don't think that's an issue.
  2. Long prompts made the connection between ollama-web-ui and ollama fail. I did not have enough time to look into this. If I have time, I will test whether the same thing also happens with oterm from this branch. I don't think that's an issue with this branch either.

Overall it seems to be running fine.

When you have added the extra config options that you still wanted to add, I think you could remove the draft flag from this PR; at that point we can get someone to take a look who feels confident reviewing the systemd service configs.


ollama-webui-package = mkPackageOption pkgs "ollama-webui" { };

host = mkOption {
Contributor:

This `host` option isn't used anywhere; is that intentional?

in {
ExecStart = "${cfg.ollama-webui-package}/bin/ollama-webui --port ${toString cfg.port} ${cors-arg}";
DynamicUser = "true";
Type = "simple";
Contributor:

Just for information: the default type for services is `simple`. I'm not sure how I feel about having it written explicitly; if you prefer it that way, feel free to keep it.
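A sketch of the same unit with the implied default dropped (option names as in this PR):

systemd.services.ollama-webui.serviceConfig = {
  ExecStart = "${cfg.ollama-webui-package}/bin/ollama-webui --port ${toString cfg.port} ${cors-arg}";
  DynamicUser = true;
  # Type = "simple" is systemd's default for services and can be omitted.
};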

npmDepsHash = "sha256-SI2dPn1SwbGwl8093VBtcDsA2eHSxr3UUC+ta68w2t8=";

# We have to bake in the default URL it will use for the ollama webserver here,
# but it can be overridden in the UI later.
Contributor:

When you say it can be overridden later, does that mean inside the module?
Just wondering if you still need to add the override.

Optional: This module is configured to run locally, but can be served from a (home) server,
ideally behind a secured reverse-proxy.
Look at <https://nixos.wiki/wiki/Nginx> or <https://nixos.wiki/wiki/Caddy>
on how to set up a reverse proxy.
Contributor:

Do you think referencing the Caddyfile of the original repo would be a good idea?
https://github.com/ollama-webui/ollama-webui/blob/main/Caddyfile.localhost

PUBLIC_API_BASE_URL = "http://localhost:11434/api";

# The path '/ollama/api' will be redirected to the specified backend URL
OLLAMA_API_BASE_URL = PUBLIC_API_BASE_URL;
Contributor:

The env var only seems to be set after the build has succeeded:
https://github.com/ollama-webui/ollama-webui/blob/main/Dockerfile#L22
Does the build fail without it?
If not, this is probably something that should be set in the service, not in the package.
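If the build does succeed without it, moving the variable into the service could look roughly like this (untested sketch; the variable name comes from the upstream Dockerfile):

systemd.services.ollama-webui.environment = {
  # Set at runtime instead of being baked into the build.
  OLLAMA_API_BASE_URL = "http://127.0.0.1:11434/api";
};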

mkdir -p $out/lib
cp -R ./build/. $out/lib

mkdir -p $out/bin
Contributor:

You don't need those last instructions.
I could be wrong, but if this is a standard node project, you just need to package the build directory; then, in the service, you can use node directly to run it.
Here is a reference for how we did this with lemmy:

name = "lemmy-ui";

cfg = config.services.lemmy;
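Translated to this package, that approach might look roughly like the following sketch; it assumes the packaged build directory contains a node entry point (as lemmy-ui does), which may not hold for ollama-webui's static build:

systemd.services.ollama-webui.serviceConfig = {
  # Run the packaged build output directly with node; no bin/ wrapper needed.
  ExecStart = "${pkgs.nodejs}/bin/node ${cfg.ollama-webui-package}/lib";
  DynamicUser = true;
};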

Second point, and this is entirely optional: rather than making a completely separate service for ollama-webui, how about including it in the original ollama service?
I don't think it makes sense to have ollama-webui as a standalone service for now.
(You can take inspiration from the lemmy module if you find stuff that you like.)

Contributor:

Just a bit more context on not providing an executable for a node package: the main reason is that without several environment variables set, the binary is just not runnable. So most of the time you package the build directory, and in the service you provide everything necessary to run it.

If you prefer to provide a binary, I respect your decision; in that case, you should probably use makeWrapper. You can look at jellyseerr for an example (you'll find many more if you don't like this particular one).
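A sketch of the makeWrapper variant, as an attribute fragment for the package expression; the wrapped environment values are illustrative assumptions:

{
  nativeBuildInputs = [ makeWrapper ];

  postInstall = ''
    # Wrap node with the flags and environment the app needs to run.
    makeWrapper ${nodejs}/bin/node $out/bin/ollama-webui \
      --add-flags "$out/lib" \
      --set PUBLIC_API_BASE_URL "http://localhost:11434/api"
  '';
}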

Member:

> Second point, and this is entirely optional: rather than making a completely separate service for ollama-webui, how about including it in the original ollama service? I don't think it makes sense to have ollama-webui as a standalone service for now.

Since ollama-webui is a community project and not officially associated with ollama (see the top of their README.md), I tend to think it does not make sense to have the ollama-webui service live under ollama as ollama.ui or ollama.web-ui.

There are a bunch of other frontends that people might want to use with the ollama service, such as oterm.

@h7x4 added the "8.has: module (new)" label on Jan 18, 2024
@happysalada (Contributor) commented

Hey, I've provided a couple of comments, but all in all, this is very nice! Thank you for your contribution. I'm here to help, so let me know if you need anything!

@wochap (Contributor) commented Jan 22, 2024

Hi, I just used your branch, and I noticed two things:

  1. The services.ollama-webui.port option doesn't work as expected. This is because the http-server inside your script wrapper does not receive the arguments (--port). To fix this, add "$@" at the end of the http-server invocation in the script wrapper, e.g.: https://github.com/wochap/nix-config/blob/dev/packages/ollama-webui.nix#L33

  2. By default, http-server does not handle single-page applications well. If you try to access http://localhost:8080/c/<chat_id> and refresh, it will result in an error. A workaround is to add --proxy http://${host}:${port}? to http-server; more info: https://github.com/http-party/http-server?tab=readme-ov-file#catch-all-redirect. However, it would be ideal to use vite preview --outDir <path_to_ollama_webui_dist_folder>. (A sketch combining both points follows this comment.)

Thanks for creating this PR!
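Putting both points together, the wrapper could look roughly like this untested sketch; the ollama-webui package reference and the port binding are placeholders:

let
  dist = "${ollama-webui}/lib";  # placeholder for the packaged static files
  port = 8080;                   # placeholder default port
in
pkgs.writeShellScriptBin "ollama-webui" ''
  # "$@" forwards flags like --port from the service (point 1);
  # --proxy makes unknown routes fall back to the SPA index (point 2).
  ${pkgs.nodePackages.http-server}/bin/http-server ${dist} \
    --proxy "http://localhost:${toString port}?" "$@"
''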

@mschwaig (Member) commented Mar 4, 2024

@malteneuss are you still interested in/do you still have time to move this forward? Would it be OK for someone else to pick it up?

For those who are interested in this functionality, I have a branch with a rough update of this branch to a more recent version of what's now called open-webui: https://github.com/mschwaig/nixpkgs/tree/open-webui
That branch is still in rough shape. I have no immediate plans to work on a PR (though that might change). Feel free to use it in this or another PR.

To me it looks like those more recent versions also require packaging some Python (I think mostly for RAG), which blows up the scope a bit, especially since some dependencies have not landed in nixpkgs yet.
I am also a bit concerned about some privacy-related config options shipped as part of the Dockerfile, which is the kind of thing we would have to really pay attention to when updating this service: https://github.com/open-webui/open-webui/blob/eb51ad14e4caafda1c9fd24c4945044b8776a7a3/Dockerfile#L30C1-L31C22

@wegank added the "2.status: merge conflict" label on Mar 20, 2024
@malteneuss (Contributor, author) commented

@happysalada @mschwaig Thanks for the review comments and for pushing this topic further. I've been busy with my full-time job and a two-year-old, and have little to no time to move this forward in the next months. Would your branch be stable and usable enough to be merged and improved in smaller steps?

Comment on lines +80 to +81
HOME = "%S/ollama";
OLLAMA_MODELS = "%S/ollama/models";
@Kreyren (Contributor) commented May 7, 2024

Isn't it better to use /var/lib/private/ollama instead of %S so that it's easier to work with these variables?

(Screenshot omitted.) I had to hard-code the path, as the specifier expansion doesn't seem to work like it does for other services.
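For context, a sketch of how these pieces interact, as I understand systemd's specifiers (not necessarily this PR's exact code): with DynamicUser plus StateDirectory, systemd stores the directory at /var/lib/private/ollama but exposes it at /var/lib/ollama inside the unit, and %S expands to /var/lib for system services, so the two spellings should point at the same data:

systemd.services.ollama.serviceConfig = {
  DynamicUser = true;
  StateDirectory = "ollama";  # on disk: /var/lib/private/ollama
};
systemd.services.ollama.environment = {
  HOME = "%S/ollama";                  # %S expands to /var/lib inside the unit
  OLLAMA_MODELS = "%S/ollama/models";
};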

@Kreyren (Contributor) commented May 7, 2024

> @happysalada @mschwaig Thanks for the review comments and for pushing this topic further. I've been busy with my full-time job and a two-year-old, and have little to no time to move this forward in the next months. Would your branch be stable and usable enough to be merged and improved in smaller steps? -- @malteneuss (#275448 (comment))

Is this still ongoing?

  • yes -> Please share the current status so that I know where to start, as I might have time to finish this PR
  • no -> cool cool

@mschwaig (Member) commented May 7, 2024

@Kreyren

As far as I remember, the chat functionality is working on my branch (linked above).
I have not tested that RAG is working yet, which is the feature that made adding all that Python necessary.
Some of the Python dependencies had to be packaged; some had open PRs, which might have been closed in the meantime.

I do not have a lot of time to look into this in the next two weeks.
My next steps would have been to take another look at my branch with the goal of opening a draft PR once I get to it, maybe rebasing onto recent master and trying to build a newer version of the package. But if someone else wants to pick this up and open a PR in the meantime, go ahead.

I would recommend that you use my branch as a starting point.

@malteneuss (Contributor, author) commented

Already done with #316248

@malteneuss closed this Jun 8, 2024